Choosing Between Synchronous and Asynchronous Integration for Dynamics 365
When working with Dynamics 365, one of the key decisions during integration design is whether to implement synchronous or asynchronous communication. Understanding the differences and the use cases for each approach is critical to building reliable, efficient, and scalable integrations.

When to Use Synchronous Integration
Synchronous integration fits scenarios where the calling process needs an immediate result before it can continue, such as validating data against another system while a record is being saved.
Advantages: Immediate confirmation and straightforward error detection.
Considerations: Can slow down the system if the target application experiences latency, and is less scalable for high-volume scenarios.

When to Use Asynchronous Integration
Asynchronous integration is better suited to high-volume or non-urgent processes where the caller does not need to wait for the result.
Advantages: Highly scalable, non-blocking operations, suitable for batch processing.
Considerations: Errors may not be detected immediately, and tracking processing status requires additional monitoring.

Decision-Making Approach
When evaluating which approach to use, weigh your business requirements, expected data volume, latency tolerance, and the performance of the systems involved. A minimal sketch contrasting the two styles follows this section.
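To make the trade-off concrete, here is a minimal C# sketch, assuming a hypothetical order-validation endpoint and an Azure Service Bus queue named "orders" (both are illustrative, not part of the original post). The synchronous path waits for the target system and surfaces errors immediately; the asynchronous path only confirms that the message was queued.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public static class IntegrationStyles
{
    // Synchronous: the caller blocks until the target system responds,
    // so success or failure is known immediately.
    public static async Task SendOrderSynchronouslyAsync(HttpClient http, string orderJson)
    {
        HttpResponseMessage response = await http.PostAsync(
            "https://example.com/api/orders", // hypothetical target endpoint
            new StringContent(orderJson, Encoding.UTF8, "application/json"));

        response.EnsureSuccessStatusCode(); // immediate error detection
    }

    // Asynchronous: the caller only confirms the message was queued;
    // a separate consumer processes it later, so throughput scales,
    // but failures surface downstream and need monitoring.
    public static async Task QueueOrderAsync(ServiceBusClient client, string orderJson)
    {
        ServiceBusSender sender = client.CreateSender("orders"); // hypothetical queue name
        await sender.SendMessageAsync(new ServiceBusMessage(orderJson));
    }
}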
To conclude, both synchronous and asynchronous integrations have distinct advantages and trade-offs. Synchronous workflows provide real-time feedback and simpler error handling, while asynchronous workflows offer scalability and efficiency for high-volume or non-urgent processes. Selecting the right approach for your Dynamics 365 integration requires careful consideration of business requirements, data volume, and system performance. By aligning the integration method with these factors, you can ensure reliable, efficient, and maintainable integrations. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Simplifying File-Based Integrations for Dynamics 365 with Azure Blob and Logic Apps
Integrating external systems with Dynamics 365 often involves exchanging files such as CSVs or XMLs between platforms. Traditionally, these integrations require custom code, complex workflows, or manual intervention, which increases maintenance overhead and reduces reliability. Leveraging Azure Blob Storage and Logic Apps can streamline file-based integrations, making them more efficient, scalable, and easier to maintain.

Why File-Based Integrations Are Still Common
While APIs are the preferred method for system integration, file-based methods remain popular, particularly with legacy systems and bulk data exchanges. The challenge lies in orchestrating file movement, transforming data, and ensuring it reaches Dynamics 365 reliably.

Enter Azure Blob Storage
Azure Blob Storage is a cloud-based object storage solution designed for massive scalability. In file-based integrations it acts as a reliable intermediary between the source system and Dynamics 365.

Orchestrating with Logic Apps
Azure Logic Apps is a low-code platform for building automated workflows, and it is particularly useful for connecting Dynamics 365 to file sources.

Real-Time Example: Automating Sales Order Uploads
Instead of manually importing sales order files, the external system drops each file into a Blob container; a Logic App detects the new blob, parses its contents, and creates the corresponding records in Dynamics 365. A minimal sketch of the external upload step follows this section.
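As a hedged illustration of the drop-off step, the following sketch uploads a sales order CSV to a Blob container using the Azure.Storage.Blobs SDK. The container name "sales-orders" and the connection string are assumptions for this example; the Logic App side is configured in the designer rather than in code.

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class SalesOrderDropOff
{
    public static async Task UploadAsync(string connectionString, string localFilePath)
    {
        // Connect to the storage account and the container the Logic App watches.
        var containerClient = new BlobContainerClient(connectionString, "sales-orders");
        await containerClient.CreateIfNotExistsAsync();

        // Upload the file; the blob name includes a timestamp to avoid collisions.
        string blobName = $"orders-{DateTime.UtcNow:yyyyMMddHHmmss}.csv";
        await containerClient.GetBlobClient(blobName).UploadAsync(localFilePath, overwrite: true);
    }
}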
To conclude, file-based integrations no longer need to be complicated or error-prone. By leveraging Azure Blob Storage for reliable file handling and Logic Apps for automated workflows, Dynamics 365 integrations become simpler, more maintainable, and scalable. The real-time sales order example shows that businesses can save time, reduce errors, and ensure data flows seamlessly between systems, allowing teams to focus on their core operations rather than manual file processing. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Essential Integration Patterns for Dynamics 365 Using Azure Logic Apps
If you’ve worked on Dynamics 365 CRM projects, you know integration isn’t optional; it’s essential. Whether you’re connecting CRM with a legacy ERP, a cloud-based marketing tool, or a SharePoint document library, the way you architect your integrations can make or break performance and maintainability. Azure Logic Apps makes this easier with its low-code interface, but using the right pattern matters. In this post, I’ll walk through seven integration patterns I’ve seen in real projects, explain where they work best, and share some lessons from the field. Whether you’re building real-time syncs, scheduled data pulls, or hybrid workflows using Azure Functions, these patterns will help you design cleaner, smarter solutions.

A Common Real-World Scenario
Let’s say you’re asked to sync Project Tasks from Dynamics 365 to an external project management system. The sync needs to be quick, reliable, and avoid sending duplicate data. Which trigger should you use? How do you detect changes without re-sending everything? Without a clear integration pattern, you might end up with brittle flows that break silently or overload your system.

Key Integration Patterns (With Real Use Cases)

1. Request-Response Pattern
What it is: A Logic App that waits for a request (usually via HTTP), processes it, and sends back a response.
Use Case: You’re building a web or mobile app that pulls data from CRM in real time, like showing a customer’s recent orders.

2. Fire-and-Forget Pattern
What it is: CRM pushes data to a Logic App when something happens. The Logic App does the work, but no one waits for confirmation.
Use Case: When a case is closed in CRM, you archive the data to SQL or notify another system via email.
Why use it: Keeps users moving with no delays; great for logging, alerts, or downstream updates.
Key Considerations: Silent failures; make sure you’re logging errors or using retries.

3. Scheduled Sync (Polling)
What it is: A Logic App that runs on a fixed schedule and pulls new or updated records using filters.
Use Case: Every 30 minutes, sync new Opportunities from CRM to SAP.

4. Event-Driven Pattern (Webhooks)
What it is: CRM triggers a webhook (HTTP call) when something happens. A Logic App or Azure Function listens and acts.
Use Case: When a Project Task is updated, push that data to another system like MS Project or Jira.

5. Queue-Based Pattern
What it is: Messages are pushed to a queue (like Azure Service Bus), and Logic Apps process them asynchronously.
Use Case: CRM pushes lead data to a queue, and Logic Apps handle the messages one by one to update different downstream systems (email marketing, analytics, etc.).

6. Blob-Driven Pattern (File-Based Integration)
What it is: A Logic App watches a Blob container or SFTP location for new files (CSV, Excel), parses them, and updates CRM.
Use Case: An external system sends daily contact updates via CSV to a storage account. A Logic App reads the file and applies the updates to CRM.

7. Hybrid Pattern (Logic Apps + Azure Functions)
What it is: The Logic App does the orchestration, while an Azure Function handles complex logic that’s hard to do with built-in connectors.
Use Case: You need to calculate dynamic pricing or apply business rules before pushing data to ERP. A minimal sketch of this pattern follows below.
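To illustrate the hybrid pattern, here is a minimal sketch of an HTTP-triggered Azure Function that a Logic App could call to compute a price before the data is pushed to ERP. The function name, payload shape, and discount rule are assumptions for the example, not part of the original post.

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Newtonsoft.Json;

public static class PricingFunction
{
    // The Logic App posts { "listPrice": 100.0, "quantity": 12 } and
    // receives the computed price back in the response body.
    [FunctionName("CalculatePrice")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        dynamic body = JsonConvert.DeserializeObject(await new StreamReader(req.Body).ReadToEndAsync());
        decimal listPrice = (decimal)body.listPrice;
        int quantity = (int)body.quantity;

        // Hypothetical business rule: 10% volume discount above 10 units.
        decimal discount = quantity > 10 ? 0.10m : 0m;
        decimal finalPrice = listPrice * quantity * (1 - discount);

        return new OkObjectResult(new { finalPrice });
    }
}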
Implementation Tips & Best Practices
Security: Use managed identity, OAuth, and Key Vault for secrets.
Error Handling: Use "Scope" + "Run After" for retries and graceful failure responses.
Idempotency: Track processed IDs or timestamps to avoid duplicate processing.
Logging: Push important logs to Application Insights or a centralized SQL log.
Scaling: Prefer event- or queue-based patterns for large volumes.
Monitoring: Use the Logic App run history, Azure Monitor, and alerts for proactive detection.

Common Architectures
You’ll often see combinations of these patterns in real-world systems, for example a webhook that drops messages onto a queue for asynchronous processing, with a scheduled sync as a nightly safety net.

To conclude, integration isn’t just about wiring up connectors; it’s about designing flows that are reliable, scalable, and easy to maintain. These seven patterns are ones I’ve personally used (and reused!) across projects. Pick the right one for your scenario, and you’ll save yourself and your team countless hours in debugging and rework. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Building a Scalable Integration Architecture for Dynamics 365 Using Logic Apps and Azure Functions
If you’ve worked with Dynamics 365 CRM on any serious integration project, you’ve probably used Azure Logic Apps. They’re great: visual, no-code, and fast to deploy. But as your integration needs grow, you quickly hit complexity: multiple entities, large volumes, branching logic, error handling, and reusability. That’s when architecture becomes critical. In this blog, I’ll share how we built a modular, scalable, and reusable integration architecture using Logic Apps, Azure Functions, and Azure Blob Storage, with a config-driven approach. Whether you’re syncing data between D365 and Finance & Operations or automating CRM workflows with external APIs, this post will help you avoid bottlenecks and stay maintainable.

Architecture Components
Parent Logic App: Entry point; reads config from blob and iterates entities.
Child Logic App(s): Handle each entity sync (Project, Task, Team, etc.).
Azure Blob Storage: Hosts configuration files, Liquid templates, and checkpoint data.
Azure Function: Performs advanced transformation via Liquid templates.
CRM & F&O APIs: Source and target systems.

Step-by-Step Breakdown

1. Configuration-Driven Logic
We didn’t hardcode URLs, fields, or entities. Everything lives in a central config.json in Blob Storage:

{
  "integrationName": "ProjectToFNO",
  "sourceEntity": "msdyn_project",
  "targetEntity": "ProjectsV2",
  "liquidTemplate": "projectToFno.liquid",
  "primaryKey": "msdyn_projectid"
}

2. Parent–Child Logic App Model
Instead of one massive workflow, we created a parent Logic App that reads the config and invokes a child Logic App per entity. Each child handles the sync for a single entity end to end.

3. Azure Function for Transformation
Why not use Logic App’s Compose or Data Operations? Because complex mapping (especially D365 → F&O) quickly becomes unreadable. Instead, an Azure Function renders a Liquid template such as:

{
  "ProjectName": "{{ msdyn_subject }}",
  "Customer": "{{ customerid.name }}"
}

A minimal sketch of this function appears later in this post.

4. Handling Checkpoints
For batch integration (daily or hourly), we store the last run timestamp in Blob:

{
  "entity": "msdyn_project",
  "modifiedon": "2025-07-28T22:00:00Z"
}

This allows delta fetches like:

?$filter=modifiedon gt 2025-07-28T22:00:00Z

After each run, we update the checkpoint blob.

5. Centralized Logging & Alerts
We configured centralized logging and alerting across the Logic Apps and Functions, which helped us track down integration mismatches fast.

Why This Architecture Works
Reusability: Config-based logic + modular templates.
Maintainability: Each Logic App has one job.
Scalability: Add new entities via config, not code.
Monitoring: Blob + Azure Monitor integration.
Transformation complexity: Handled via Azure Functions + Liquid.
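As a rough sketch of the transformation function described in step 3, the following HTTP-triggered Azure Function renders a Liquid template with the DotLiquid library. The payload shape (template text plus source record in one request) is an assumption for illustration; in the real architecture the template would be fetched from Blob Storage.

using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using DotLiquid;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Newtonsoft.Json;

public static class TransformFunction
{
    // Expects { "template": "<liquid text>", "data": { ...source record... } }
    // and returns the rendered payload for the target system.
    [FunctionName("Transform")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        var input = JsonConvert.DeserializeObject<TransformRequest>(body);

        // Parse the Liquid template and render it against the source record.
        Template template = Template.Parse(input.Template);
        string rendered = template.Render(Hash.FromDictionary(input.Data));

        return new OkObjectResult(rendered);
    }

    public class TransformRequest
    {
        public string Template { get; set; }
        public Dictionary<string, object> Data { get; set; }
    }
}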
To conclude, this architecture has helped us deliver scalable Dynamics 365 integrations, including syncing Projects, Tasks, Teams, and Time Entries to F&O, all without rewriting Logic Apps every time a client asks for a tweak. If you’re working on medium to complex D365 integrations, consider going config-driven and breaking your workflows into modular components. It keeps things clean, reusable, and much easier to maintain in the long run. I hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
When to Use Azure Data Factory vs Logic Apps in Dynamics 365 Integrations
You’re integrating Dynamics 365 CRM with other systems, but you’re not sure whether to use Azure Data Factory or Logic Apps. Both support connectors, data transformation, and scheduling, yet they serve different purposes. When you’re working on integrating Dynamics 365 with other systems, these two Azure tools come up constantly. I’ve been asked many times which one to use, and honestly, there’s no one-size-fits-all answer. Based on real-world experience integrating D365 CRM and Finance, here’s how I approach choosing between Logic Apps and Azure Data Factory (ADF).

When to Use Logic Apps
Azure Logic Apps is ideal when your integration involves:
1. Event-driven / real-time integration
2. REST APIs and lightweight automation
3. Business process workflows
4. Quick and visual flow creation

When to Use Azure Data Factory
Azure Data Factory is better for:
1. Large-volume, batch data movement
2. ETL / ELT scenarios
3. Integration with data lakes and warehouses
4. Advanced data flow transformation

Feature Comparison (Logic Apps vs. Data Factory)
Trigger on record creation/update: Yes vs. No (batch only)
Handles APIs (HTTP, REST, OData): Excellent vs. Limited
Real-time integration: Yes vs. No
Large data volumes (batch): Limited vs. Excellent
Data lake / warehouse integration: Basic (via connectors) vs. Deep support
Visual workflow: Visual designer vs. Visual (for data flows)
Custom code / transformation: Limited (use an Azure Function) vs. Strong via Data Flows
Cost for high volume: Higher (per run) vs. Cost-efficient for batch

A minimal sketch of an ADF batch pipeline follows this section.
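For the batch side, a Data Factory pipeline is ultimately defined as JSON. Here is a minimal, illustrative sketch of a copy activity that moves data from a Blob dataset to an Azure SQL dataset; the dataset names are assumptions, and a real pipeline would normally be authored in the ADF designer rather than by hand.

{
  "name": "CopyBlobToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyBlobToSql",
        "type": "Copy",
        "inputs": [ { "referenceName": "BlobOrdersDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SqlOrdersDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}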
To conclude, choose Logic Apps for real-time, low-volume, API-based workflows, and use Data Factory for batch ETL pipelines, high-volume exports, and reporting pipelines. Integrations in Dynamics 365 CRM aren’t one-size-fits-all; pick the right tool based on data size, speed, and transformation needs. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
How to Use Webhooks for Real-Time CRM Integrations in Dynamics 365
Are you looking for a reliable way to integrate Dynamics 365 CRM with external systems in real time? Polling APIs or scheduling batch jobs can delay updates and increase complexity. What if you could instantly notify external services when key events happen inside Dynamics 365? This blog shows how to use webhooks, an event-driven mechanism, to trigger real-time updates and data exchange with external services, making your integrations faster and more efficient.

A webhook is a user-defined HTTP callback that is triggered by specific events. Instead of your external system repeatedly asking Dynamics 365 for updates, Dynamics 365 pushes updates to your service instantly. Dynamics 365 supports registering webhooks through plugin steps that execute when specific messages (create, update, delete, etc.) occur. This approach provides low latency and ensures that your external systems always have fresh data. This walkthrough covers the end-to-end process of configuring webhooks in Dynamics 365, registering them via plugins, and securely triggering external services.

Step 1: Create Your Webhook Endpoint
Expose an HTTPS endpoint that can receive the POSTed execution context. A minimal sketch of such an endpoint appears at the end of this post.

Step 2: Register Your Webhook in Dynamics 365
Register the endpoint (for example, with the Plugin Registration Tool) so it is available as a service endpoint in your environment.

Step 3: Create a Plugin to Trigger the Webhook

public void Execute(IServiceProvider serviceProvider)
{
    var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
    var notificationService = (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

    // Replace with the ID of your webhook's service endpoint registration.
    Guid webhookEndpointId = new Guid("00000000-0000-0000-0000-000000000000");

    // Post the execution context (which includes the Target entity)
    // to the service endpoint that represents your registered webhook.
    notificationService.Execute(new EntityReference("serviceendpoint", webhookEndpointId), context);
}

Register this plugin step for your message (Create, Update, Delete) on the entity you want to monitor.

Step 4: Test Your Webhook Integration
Create or update a record that matches your registered step and confirm your endpoint receives the POST.

Step 5: Security and Best Practices
Use HTTPS, validate a shared key or header on every request, and log failures so missed notifications are visible.

Real-World Use Case
A company wants to notify its external billing system immediately when a new invoice is created in Dynamics 365. By registering a webhook triggered by the Invoice creation event, the billing system receives data instantly and processes payment without delays or manual intervention.
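As a companion to Step 1, here is a minimal, hedged sketch of a receiving endpoint implemented as an HTTP-triggered Azure Function. The shared-secret header name and the logging are illustrative assumptions, not part of the original walkthrough.

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class WebhookReceiver
{
    [FunctionName("CrmWebhookReceiver")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // Optional extra check: Dynamics 365 webhook registrations can be
        // configured to send an HttpHeader value that we validate here.
        if (req.Headers["x-crm-secret"] != "expected-secret-value") // assumed header name
        {
            return new UnauthorizedResult();
        }

        // The body is the serialized execution context of the triggering event.
        string payload = await new StreamReader(req.Body).ReadToEndAsync();
        log.LogInformation("Received CRM webhook payload: {payload}", payload);

        // Process or forward the payload here, then acknowledge quickly.
        return new OkResult();
    }
}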
To conclude, webhooks offer a powerful way to build real-time, event-driven integrations with Dynamics 365, reducing latency and complexity in your integration solutions. We encourage you to start by creating a simple webhook endpoint and registering it with Dynamics 365 using plugins; this can transform how your CRM communicates with external systems. For deeper technical support and advanced integration patterns, explore CloudFronts’ resources or get in touch with our experts to accelerate your Dynamics 365 integration project. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Top 5 Ways to Integrate Microsoft Dynamics 365 with Other Systems
When it comes to Microsoft Dynamics 365, one of its biggest strengths, and challenges, is how many ways there are to integrate it with other platforms. Whether you’re syncing with an ERP, pushing data to a data lake, or triggering notifications in Teams, the real question becomes: which integration method should you choose? In this blog, we’ll break down the top five tools used by teams around the world to integrate Dynamics 365 with other systems. Each has its strengths, and each fits a different type of use case.

1. Power Automate – Best for Quick, No-Code Automations
What it is: A low-code platform built into the Power Platform suite.
When to use it: Internal automations, approvals, email notifications, basic integrations.
Lesser-Known Tip: Power Automate runs on two plans, per user and per flow. If you have dozens of similar flows, the per-flow plan can be more cost-effective than individual licenses.
Advanced Feature: You can call Azure Functions or hosted APIs directly within a flow, effectively turning it into a lightweight integration framework.
Example: When a new lead is created in D365, send an email alert and create a task in Outlook.

2. Azure Logic Apps – Best for Scalable Integrations
What it is: A cloud-based workflow engine for system-to-system integrations.
When to use it: Large-scale or backend integrations, especially when working with APIs.
Lesser-Known Tip: Logic Apps come in two flavours, Consumption and Standard. The Standard tier offers VNET integration, local development, and built-in connectors at a flat rate, which is ideal for predictable, high-throughput scenarios.
Advanced Feature: Use a Logic Apps Integration Account to manage schemas, maps, and certificates for B2B scenarios (AS2, X12).
Example: Sync Dynamics 365 opportunities with a SQL database in real time.

3. Data Export Service / Azure Synapse Link – Best for Analytics
What it is: Tools to replicate D365 data into Azure SQL or Azure Data Lake.
When to use it: Advanced reporting, Power BI, historical data analysis.
Lesser-Known Tip: Data Export Service has been deprecated in favour of Azure Synapse Link, which provides both near-real-time and materialized-view patterns. You can even write custom analytics in Spark directly against your live CRM data.
Advanced Feature: With Synapse Link, you can enable a change data feed and query Delta tables in Synapse, unlocking time-travel queries for historical analysis.
Example: Export all account and contact data to Azure Synapse and visualize KPIs in Power BI.

4. Dual-write – Best for D365 F&O Integration
What it is: A Microsoft-native framework to connect D365 CE (Customer Engagement) and D365 F&O (Finance & Operations).
When to use it: Bi-directional, real-time sync between CRM and ERP.
Lesser-Known Tip: Dual-write leverages the Common Data Service (Dataverse) pipeline under the covers, so any customization (custom entities, fields) you add to Dataverse automatically flows through to F&O once you map it.
Advanced Feature: You can extend dual-write with custom Power Platform flows to handle pre- or post-processing logic before records land in F&O.
Example: Automatically sync customer and invoice records between D365 Sales and Finance.

5. Custom APIs & Webhooks – Best for Complex, Real-Time Needs
What it is: Developer-driven integrations using HTTP APIs or Dynamics 365 webhooks.
When to use it: External systems, fast processing, custom business logic.
Lesser-Known Tip: Dynamics 365 supports registering multiple webhook subscribers on the same event, so you can chain independent systems (e.g., call your middleware, then a monitoring service) without writing code.
Advanced Feature: Combine webhooks with Azure Event Grid for enterprise-grade event routing, retry policies, and dead-lettering.
Example: Trigger an API call to a shipping provider when a case status changes to "Ready to Ship"; a minimal sketch follows below.
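To ground the webhook-plus-Event-Grid idea, here is a hedged sketch of a relay: an Azure Function receives the Dynamics 365 webhook call and republishes it to an Event Grid topic, which then handles routing, retries, and dead-lettering for subscribers such as the shipping provider adapter. The topic endpoint, access key, and event type name are all illustrative assumptions.

using System.IO;
using System.Threading.Tasks;
using Azure;
using Azure.Messaging.EventGrid;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class CaseStatusRelay
{
    [FunctionName("CaseReadyToShipRelay")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // Raw execution context posted by the Dynamics 365 webhook.
        string payload = await new StreamReader(req.Body).ReadToEndAsync();

        // Republish to an Event Grid topic; Event Grid fans the event out
        // to subscribers (shipping adapter, monitoring, etc.) with retries.
        var client = new EventGridPublisherClient(
            new System.Uri("https://my-topic.westus2-1.eventgrid.azure.net/api/events"), // assumed topic
            new AzureKeyCredential("<topic-access-key>")); // placeholder key

        await client.SendEventAsync(
            new EventGridEvent("crm/cases", "Crm.Case.ReadyToShip", "1.0", payload)); // assumed event type

        return new OkResult();
    }
}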
To conclude, Microsoft Dynamics 365 gives you a powerful set of integration tools, each designed for a different type of business need. Whether you need something quick and simple (Power Automate), enterprise-ready (Logic Apps), or real-time and custom (webhooks), there’s a solution that fits. Take a moment to evaluate your integration scenario: which systems are involved, how much data you are moving, and what your tolerance for latency and failure is. If you’re unsure which route to take, or need help designing and implementing your integrations, reach out to our team for a free consultation. Let’s make your Dynamics 365 ecosystem work smarter, together. We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
Getting Started with OData Queries in Microsoft Dynamics 365
Have you ever needed to pull data out of Dynamics 365 but didn’t know where to begin? Whether you’re building a report, wiring up a Power App, or feeding data into another system, OData is your friend. In just a few clicks, you’ll be able to write simple HTTP requests to retrieve exactly the records you want, no complex code required.

What Is OData and Why It Matters
OData (Open Data Protocol) is a standardized way to query RESTful APIs. Microsoft Dynamics 365 exposes its entire data model via OData, which means faster development and fewer custom endpoints.

1. Finding Your Web API Endpoint
Your environment’s base URL looks like this (you can find it under Developer Resources in your org’s settings):

https://yourorg.crm.dynamics.com/api/data/v9.2

That’s your base URL for every OData call.

2. Exploring Entities via Metadata
Append $metadata to your base URL:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/$metadata

You’ll get an XML file listing all entities (contacts, accounts, leads, etc.), their fields, data types, and navigation properties. Tip: press Ctrl + F to search for your entity by name.

3. Core OData Query Options

a. $select – Return Only What You Need

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$select=fullname,emailaddress1,jobtitle

This limits the payload to just those three fields, making responses smaller and faster.

b. $filter – Narrow Down Your Results

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$filter=firstname eq 'Ankit'

Operators: eq (equals), ne (not equals), gt / lt (greater than / less than). Combine them with and / or:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$filter=statecode eq 0 and jobtitle eq 'Consultant'

c. $orderby – Sort Your Data

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$orderby=createdon desc

Newest records appear first.

d. $top – Limit Record Count

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$top=5

Great for previews or testing.

e. $expand – Fetch Related Records
Example: Get each contact’s full name and its parent account name in one request:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$select=fullname,parentcustomerid&$expand=parentcustomerid_account($select=name)

Here, parentcustomerid is the lookup field, parentcustomerid_account is the navigation property, and the nested $select limits the expanded fields.

Another example: Expand opportunities with customer account info:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/opportunities?$expand=customerid_account($select=name,accountnumber)

Finding Expandable Names
In your $metadata, look for lines like:

<NavigationProperty Name="parentcustomerid_account" Type="Microsoft.Dynamics.CRM.account" />

Use that Name value in your $expand.

Putting It All Together
Suppose you want all active contacts at "Contoso" and their account names:

GET https://yourorg.crm.dynamics.com/api/data/v9.2/contacts?$filter=statecode eq 0&$expand=parentcustomerid_account($filter=name eq 'Contoso';$select=name)&$select=fullname,emailaddress1

A sketch of a similar query issued from C# follows this section.
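If you later need the same kind of query from code, here is a minimal C# sketch using HttpClient. It assumes you already have an OAuth bearer token for the environment; token acquisition is out of scope here, and the token value is a placeholder.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class ODataQuerySample
{
    public static async Task<string> GetActiveConsultantsAsync(string accessToken)
    {
        using var http = new HttpClient { BaseAddress = new Uri("https://yourorg.crm.dynamics.com/api/data/v9.2/") };
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
        http.DefaultRequestHeaders.Add("OData-Version", "4.0");
        http.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

        // Same shape as the GET examples above: $select + $filter.
        string query = "contacts?$select=fullname,emailaddress1&$filter=statecode eq 0 and jobtitle eq 'Consultant'";
        HttpResponseMessage response = await http.GetAsync(query);
        response.EnsureSuccessStatusCode();

        return await response.Content.ReadAsStringAsync(); // JSON payload with a "value" array
    }
}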
Conclusion
OData might sound technical at first, but once you get the hang of it, it becomes one of the most powerful tools in your Dynamics 365 toolbox. Whether you’re building integrations, reports, or simple automations, OData gives you the flexibility to query exactly what you need without relying on custom development. Start small: open your environment, locate the Web API URL, and try your first $select or $filter query. Once you’re confident, move on to advanced options like $expand and $orderby. Need help designing smarter OData-based solutions or integrating with Power Platform tools? Reach out to our team today and we’ll help you build something great.
Creating and Accessing Blob Storage with Azure Data Factory: A Complete Guide
Introduction
This guide walks you through creating and accessing Azure Blob Storage and integrating it with Azure Data Factory to automate data pipelines. From setting up a storage account and managing containers to configuring pipelines and transferring data to an Azure SQL Database, this step-by-step tutorial gives you a comprehensive understanding of the process.

Steps
1. Sign in to the Azure Portal.
2. Search for and open the Storage accounts service.
3. Click + Create to initiate the creation of a new storage account.
4. Fill in the required fields such as subscription, resource group, and region, and review all the settings before proceeding.
5. Click Create to provision the storage account.
6. Once the storage account is created, go to the resource by clicking Go to Resource.
7. In the storage account, navigate to the Containers section and click + Container to create a new container for storing your files.
8. Click the container you just created to access its contents.
9. Upload the desired JSON file into the container by clicking Upload and selecting the file from your local system. (A sample file is sketched after these steps.)
10. Ensure that the uploaded file is now listed in the container.
11. Go back to the Azure Portal and search for Azure Data Factory to open the ADF service.
12. From the ADF home screen, go to Author > Datasets and click + New Dataset to create a new dataset for your Blob Storage.
13. Select the Azure Blob Storage dataset type, as you are working with data stored in Blob Storage.
14. Choose the data format that matches the file you uploaded, such as JSON, and click Continue.
15. Enter the necessary details for your dataset, including the file path and format settings. Select the appropriate authentication type and specify the storage account where the Blob Storage resides. Click Create to finalize the dataset creation.
16. Verify the settings and click OK to confirm the dataset configuration.
17. Navigate to the Pipelines section and click + New Pipeline to create a pipeline that will define your data flow.
18. The pipeline is created successfully.
19. In the pipeline, select the dataset type as Azure SQL Database and click Continue to set up the SQL Database dataset.
20. Provide the necessary linked service details for your SQL database and click Create.
21. After configuring both the source and target datasets and the pipeline, publish all the elements to save your work.
22. Once the pipeline runs successfully, verify its functionality by querying the destination database to ensure data is being transferred properly:
a. Go to the SQL Database service and select the relevant database.
b. Select the database against which to run the query.
c. Log in with your credentials.
d. Write a simple test query, execute it, and confirm that the expected output is returned.
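For reference, here is a small, purely illustrative JSON file of the kind you might upload in step 9; the field names, and the matching SQL table mentioned below, are assumptions for this walkthrough.

[
  { "CustomerId": 1, "Name": "Contoso Ltd", "City": "Seattle" },
  { "CustomerId": 2, "Name": "Fabrikam Inc", "City": "Mumbai" }
]

With a file like this, the copy activity maps each JSON property to a column of the destination table, so a test query such as SELECT TOP 10 * FROM dbo.Customers (table name assumed) should return the two rows above after a successful run.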
Conclusion
Integrating Azure Blob Storage with Azure Data Factory is a powerful way to manage and automate data workflows in the cloud. This guide walked you through creating a storage account, configuring containers, uploading data, and designing a pipeline to process and transfer data to an Azure SQL Database. By following these steps, you can efficiently handle large-scale data integration and ensure seamless communication between your data sources and destinations. Azure Data Factory not only simplifies the process of orchestrating data pipelines but also provides robust options for monitoring and optimizing workflows. Whether you are managing JSON files, processing transactional data, or setting up complex ETL processes, Azure’s ecosystem offers a reliable and scalable solution. Start exploring these tools today to unlock new possibilities in data-driven operations! We hope you found this blog useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.
All-in-One Guide to C# Plugin Data Types: Working with Strings, Currency, Lookups, and More
In Dynamics 365 and Power Platform, C# plugins play a crucial role in extending the functionality of your applications. One of the key aspects developers need to grasp is how to handle various data types effectively. This guide walks you through the most commonly used data types in C# plugins, including strings, currency, lookups, option sets, and more.

Introduction to C# Plugins
C# plugins are custom business logic that you can implement in Dynamics 365 to execute in response to specific events, such as creating, updating, or deleting records. They allow you to manipulate data and interact with the system in powerful ways. Understanding how to work with different data types is essential for writing effective plugins.

Retrieving Entities
Before you can manipulate data types, you first need to retrieve the entity record you want to work with:

public void Execute(IServiceProvider serviceProvider)
{
    IPluginExecutionContext context =
        (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
    IOrganizationServiceFactory serviceFactory =
        (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
    IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

    // Retrieve the entity by ID
    Entity entity = service.Retrieve("entity_logical_name", context.PrimaryEntityId, new ColumnSet(true));
}

Working with Different Data Types

String
Strings are the most common data type and represent text values.
Getting a String Value:
string name = entity.GetAttributeValue<string>("string_attribute_name");
Setting a String Value:
entity["string_attribute_name"] = "New String Value";

Currency
Currency is represented by the Money type in Dynamics 365.
Getting a Currency Value:
Money amount = entity.GetAttributeValue<Money>("currency_attribute_name");
decimal currencyValue = amount?.Value ?? 0;
Setting a Currency Value:
entity["currency_attribute_name"] = new Money(150.00m); // Set to 150.00

Lookups
Lookup fields refer to related entities. They are represented by the EntityReference type.
Getting a Lookup Value:
EntityReference lookup = entity.GetAttributeValue<EntityReference>("lookup_attribute_name");
if (lookup != null)
{
    Guid lookupId = lookup.Id;
    string lookupName = lookup.Name; // May require another retrieve call to populate
}
Setting a Lookup Value:
entity["lookup_attribute_name"] = new EntityReference("related_entity_logical_name", lookupId);

Option Sets (Picklists)
Option sets represent a list of choices and use the OptionSetValue type.
Getting an Option Set Value:
OptionSetValue optionSetValue = entity.GetAttributeValue<OptionSetValue>("optionset_attribute_name");
int selectedValue = optionSetValue?.Value ?? 0;
Setting an Option Set Value:
entity["optionset_attribute_name"] = new OptionSetValue(1); // Assuming 1 is a valid option

Multiselect Option Set
Multiselect option sets allow multiple selections from a list.
Getting a Value:
IEnumerable<OptionSetValue> multiSelectOptions = entity.GetAttributeValue<IEnumerable<OptionSetValue>>("multiselect_optionset");
Setting a Value:
entity["multiselect_optionset"] = new List<OptionSetValue> { new OptionSetValue(1), new OptionSetValue(2) }; // Assuming 1 and 2 are valid options

Boolean Values (Two Options)
Boolean (Two Options) fields represent true/false values and are surfaced as bool in the SDK.
Getting a Boolean Value:
bool? isActive = entity.GetAttributeValue<bool?>("boolean_attribute_name");
Setting a Boolean Value:
entity["boolean_attribute_name"] = true; // or false
Note that Two Options fields are also booleans; reading them as OptionSetValue will fail:
bool isSelected = entity.GetAttributeValue<bool>("two_options_attribute");
entity["two_options_attribute"] = true; // 'Yes'
DateTime
DateTime fields are used for date and time information.
Getting a DateTime Value:
DateTime? createdOn = entity.GetAttributeValue<DateTime?>("datetime_attribute_name");
Setting a DateTime Value:
entity["datetime_attribute_name"] = DateTime.UtcNow; // Set to the current date and time

Image
Image fields store binary data such as photos. In the SDK the attribute value is a byte array.
Getting an Image:
byte[] imageData = entity.GetAttributeValue<byte[]>("entityimage");

Whole Number
Whole numbers are represented as int.
Getting a Value:
int wholeNumber = entity.GetAttributeValue<int>("whole_number_attribute");
Setting a Value:
entity["whole_number_attribute"] = 42;

Floating Point Number
Floating Point Number fields are represented as double in the SDK.
Getting a Value:
double floatingPointNumber = entity.GetAttributeValue<double>("floating_point_attribute");
Setting a Value:
entity["floating_point_attribute"] = 3.14;

Decimal Number
Decimal fields store precise decimal values.
Getting a Value:
decimal decimalNumber = entity.GetAttributeValue<decimal>("decimal_number_attribute");
Setting a Value:
entity["decimal_number_attribute"] = 123.45m;

Saving Your Changes
After retrieving the desired values and making changes, update the entity using the Update method:
service.Update(entity);

Some additional data types you might encounter in Dynamics 365 C# plugins:

1. Guid
Getting a Value:
Guid uniqueId = entity.GetAttributeValue<Guid>("unique_attribute");
Setting a Value:
entity["unique_attribute"] = new Guid("d3c1d9c8-7438-44b5-91b1-f40241b0f84d");

2. Composite Fields
Getting a Value:
string city = entity.GetAttributeValue<string>("address1_city");
string state = entity.GetAttributeValue<string>("address1_stateorprovince");
Setting a Value:
entity["address1_city"] = "Seattle";
entity["address1_stateorprovince"] = "WA";

3. Unique Identifier (Primary Key)
Getting a Value:
Guid entityId = entity.Id;
Setting a Value:
Entity newEntity = new Entity("entity_logical_name") { Id = Guid.NewGuid() };

4. PartyList (Used in Activities)
Getting a Value:
EntityCollection partyList = entity.GetAttributeValue<EntityCollection>("to");
foreach (Entity party in partyList.Entities)
{
    EntityReference partyRef = party.GetAttributeValue<EntityReference>("partyid");
    // Do something with partyRef
}
Setting a Value:
Entity party = new Entity("activityparty");
party["partyid"] = new EntityReference("contact", new Guid("c3e4b159-64af-4c3d-b894-6d62007dbe79"));
entity["to"] = new EntityCollection(new List<Entity> { party });
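Putting several of these types together, here is a minimal end-to-end sketch of a plugin that follows the retrieve-read-set-update flow described above. The entity and attribute names (an order with a discount option set driving a Money value) are hypothetical, purely for illustration.

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class ApplyDiscountPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        // Retrieve the record the step fired on (hypothetical entity and columns).
        Entity entity = service.Retrieve("new_order", context.PrimaryEntityId,
            new ColumnSet("new_totalamount", "new_discounttype"));

        // Read a currency value and an option set value.
        Money total = entity.GetAttributeValue<Money>("new_totalamount");
        OptionSetValue discountType = entity.GetAttributeValue<OptionSetValue>("new_discounttype");

        if (total != null && discountType?.Value == 1) // 1 = "Preferred customer" (assumed)
        {
            // Set a currency and a string back on the record, then save.
            entity["new_discountedamount"] = new Money(total.Value * 0.9m);
            entity["new_discountnote"] = "10% preferred-customer discount applied";
            service.Update(entity);
        }
    }
}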
Conclusion
Handling various data types in Dynamics 365 plugins is critical for creating robust and adaptable solutions. Knowing how to work with fields such as text, currencies, lookups, and option sets enables you to manage data more precisely and build custom logic in your CRM. With these examples, you can now retrieve and set values for a variety of common data types, making it easy to create plugins that match your organization’s requirements. As you gain experience, you’ll find that handling data in plugins becomes easier, letting you focus on developing smart and effective solutions. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.